216 research outputs found

    Building Combined Classifiers

    This chapter covers different approaches that may be taken when building an ensemble method, through studying specific examples of each approach from research conducted by the authors. A method called Negative Correlation Learning illustrates a decision level combination approach with individual classifiers trained co-operatively. The model level combination paradigm is illustrated via a tree combination method. Finally, another variant of the decision level paradigm, with individuals trained independently instead of co-operatively, is discussed as applied to churn prediction in the telecommunications industry.

    Walking Through Chengdu

    Mark Eastwood is a student at Louisiana Tech University studying English. He likes potatoes.

    Building well-performing classifier ensembles: model and decision level combination.

    There is a continuing drive for better, more robust generalisation performance from classification systems, and prediction systems in general. Ensemble methods, or the combining of multiple classifiers, have become an accepted and successful tool for doing this, though the reasons for success are not always entirely understood. In this thesis, we review the multiple classifier literature and consider the properties an ensemble of classifiers - or collection of subsets - should have in order to be combined successfully. We find that the framework of Stochastic Discrimination provides a well-defined account of these properties, which are shown to be strongly encouraged in a number of the most popular/successful methods in the literature via differing algorithmic devices. This uncovers some interesting and basic links between these methods, and aids understanding of their success and operation in terms of a kernel induced on the training data, with form particularly well suited to classification. One property that is desirable in both the SD framework and in a regression context, the ambiguity decomposition of the error, is de-correlation of individuals. This motivates the introduction of the Negative Correlation Learning method, in which neural networks are trained in parallel in a way designed to encourage de-correlation of the individual networks. The training is controlled by a parameter λ governing the extent to which correlations are penalised. Theoretical analysis of the dynamics of training results in an exact expression for the interval in which we can choose λ while ensuring stability of the training, and a value λ∗ for which the training has some interesting optimality properties. These values depend only on the size N of the ensemble. Decision level combination methods often result in a difficult-to-interpret model, and NCL is no exception. However in some applications, there is a need for understandable decisions and interpretable models. 
In response to this, we depart from the standard decision level combination paradigm to introduce a number of model level combination methods. As decision trees are one of the most interpretable model structures used in classification, we chose to combine structure from multiple individual trees to build a single combined model. We show that extremely compact, well performing models can be built in this way. In particular, a generalisation of bottom-up pruning to a multiple-tree context produces good results in this regard. Finally, we develop a classification system for a real-world churn prediction problem, illustrating some of the concepts introduced in the thesis, and a number of more practical considerations which are of importance when developing a prediction system for a specific problem.
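The λ-penalised parallel training described in the abstract can be sketched numerically. The following is an illustrative NumPy toy only, not the thesis implementation: the "networks" are stand-in linear models, and the data, ensemble size, λ and learning rate are all made up for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up 1-D regression data, purely for illustration.
x = rng.uniform(-1.0, 1.0, size=200)
d = np.sin(3.0 * x) + 0.1 * rng.normal(size=200)

N = 5        # ensemble size
lam = 0.5    # the NCL penalty strength lambda (chosen arbitrarily here)
lr = 0.1

# Stand-in "networks": simple linear models f_i(x) = w_i * x + b_i.
w = rng.normal(size=N)
b = np.zeros(N)

def ensemble_mse():
    fbar = (x[:, None] * w + b).mean(axis=1)
    return ((fbar - d) ** 2).mean()

start = ensemble_mse()
for _ in range(500):
    F = x[:, None] * w + b                 # (200, N): individual outputs f_i
    fbar = F.mean(axis=1, keepdims=True)   # ensemble output
    # NCL error of network i: 1/2 (f_i - d)^2 + lam (f_i - fbar) sum_{j!=i}(f_j - fbar).
    # Using sum_{j!=i}(f_j - fbar) = -(f_i - fbar), its gradient wrt f_i is:
    delta = (F - d[:, None]) - lam * (F - fbar)
    w -= lr * (delta * x[:, None]).mean(axis=0)
    b -= lr * delta.mean(axis=0)

# Parallel, de-correlation-penalised training still reduces the ensemble error.
assert ensemble_mse() < start
```

Each network's gradient contains its own error term plus a λ-weighted pull relative to the ensemble mean, which is what encourages the de-correlation the thesis analyses.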

    Selling the nuclear state: John F. Kennedy, national security and nuclear testing

    This thesis assesses the techniques of nuclear salesmanship by which the American public have been conditioned to support and accept the centrality of nuclear weapons to national security. It does this by examining in detail the changing policy of President John F. Kennedy toward nuclear testing during his short presidency and the methods by which he successfully moved public support from a moratorium on testing, to a resumption of testing, and back again to support the signing and ratification of the Partial Test Ban Treaty in 1963. In doing so, it argues that the Kennedy years mark a crucial moment in the history of nuclear salesmanship in which new techniques were inaugurated which had a lasting effect on the American psyche and stilted the growth of the national anti-nuclear movement until the 1980s. ‘Selling the Nuclear State: John F. Kennedy, National Security and Nuclear Testing’ addresses the innovative techniques of nuclear salesmanship which President Kennedy adopted, centred on a dual-track approach which balanced notions of national security with concerns over individual safety. This balancing act allowed Kennedy to appease concerns over his testing policy from both conservatives and liberals. Crucially, the thesis also charts how the burgeoning anti-nuclear movement challenged and subverted Kennedy’s efforts, particularly in the wake of the Cuban Missile Crisis. In 1963, Kennedy would secure the Partial Test Ban Treaty which, on the surface, appeared to be a victory for activists and those concerned with nuclear testing. However, the treaty was actually the apotheosis of Kennedy’s nuclear salesmanship, appeasing fears over safety from nuclear testing whilst allowing the continued development of nuclear weapons. Accordingly, Kennedy inaugurated a policy of utilising limited arms control agreements as a domestic political tool and a key component of nuclear salesmanship. 
In highlighting the domestic importance of the test ban, the thesis not only addresses the history of nuclear salesmanship but underscores the importance of domestic politics in the broader history of U.S. foreign relations. Organised chronologically, each chapter of the thesis charts the changing methods of nuclear salesmanship of John F. Kennedy. It does so by putting a top down study of policy making into dialogue with a bottom up study of anti-nuclear activism to establish how Kennedy adopted his strategies in relation to increasing popular concerns. In highlighting how Kennedy implemented and developed a dual-track approach to nuclear salesmanship, centred on national security and personal safety, the thesis ultimately explains how Kennedy was able to successfully appease and silence anti-nuclear activism whilst allowing nuclear weapons to remain central to American Cold War strategy. The techniques Kennedy developed would be taken on and employed by many of his successors, underscoring the importance of the Kennedy years to the history of nuclear salesmanship.

    Lambda as a Complexity Control in Negative Correlation Learning

    The parameter λ in negative correlation learning (NC) controls the degree of co-operation between individual networks. This paper looks at the way the choice of λ in the NC algorithm affects the complexity of the function NC can fit, and shows that it acts as a complexity control allowing smooth adjustment of the network between one large back-propagated network, and many independent networks individually trained before combination. The effect of the base complexity of the individual networks, and the number of networks that are trained co-operatively, on the algorithm's overall complexity is also empirically investigated. Empirical results are presented from 4 different datasets. The way in which the performance changes as parameters change, and the point at which over-fitting sets in, give us information about the complexity of the model the algorithm can fit at those parameter settings.
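The two endpoints described in the abstract (one large back-propagated network versus many independently trained networks) can be checked directly from the standard NC gradient. This is a minimal sketch under assumed conventions: a single training example with invented outputs, and the usual NC penalty form p_i = (f_i - f̄) Σ_{j≠i}(f_j - f̄).

```python
import numpy as np

rng = np.random.default_rng(1)
N = 4
d = rng.normal()          # target for a single training example (made up)
f = rng.normal(size=N)    # outputs f_i of the N individual networks (made up)
fbar = f.mean()

def nc_delta(lam):
    # NC error of network i: e_i = 1/2 (f_i - d)^2 + lam * p_i, with
    # p_i = (f_i - fbar) * sum_{j!=i}(f_j - fbar) = -(f_i - fbar)^2.
    # Differentiating wrt f_i in the usual NC fashion gives:
    return (f - d) - lam * (f - fbar)

# lam = 0: each network sees only its own error -> independent training.
assert np.allclose(nc_delta(0.0), f - d)

# lam = 1: every network receives the identical signal (fbar - d), i.e. (up to
# a factor of N) the gradient of the ensemble error 1/2 (fbar - d)^2 -- the
# ensemble behaves like one large back-propagated network.
assert np.allclose(nc_delta(1.0), np.full(N, fbar - d))
print("gradient endpoint checks passed")
```

Intermediate λ values interpolate between these signals, which is the sense in which λ acts as a smooth complexity control.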

    New approaches to investigating the function of mycelial networks

    Fungi play a key role in ecosystem nutrient cycles by scavenging, concentrating, translocating and redistributing nitrogen. To quantify and predict fungal nitrogen redistribution, and assess the importance of the integrity of fungal networks in soil for ecosystem function, we need better understanding of the structures and processes involved. Until recently nitrogen translocation has been experimentally intractable owing to the lack of a suitable radioisotope tracer for nitrogen, and the impossibility of observing nitrogen translocation in real time under realistic conditions. We have developed an imaging method for recording the magnitude and direction of amino acid flow through the whole mycelial network as it captures, assimilates and channels its carbon and nitrogen resources, while growing in realistically heterogeneous soil microcosms. Computer analysis and modeling, based on these digitized video records, can reveal patterns in transport that suggest experimentally testable hypotheses. Experimental approaches that we are developing include genomics and stable isotope NMR to investigate where in the system nitrogen compounds are being acquired and stored, and where they are mobilized for transport or broken down. The results are elucidating the interplay between environment, metabolism, and the development and function of transport networks as mycelium forages in soil. The highly adapted and selected foraging networks of fungi may illuminate fundamental principles applicable to other supply networks.

    OPAL Community Environment Report

    The Open Air Laboratories network, or OPAL, as it quickly became known, was launched in 2007 following a successful application to the Big Lottery Fund. It was the first time that Big Lottery funding on this scale had been awarded to academic institutions. The University of Central Lancashire, led by Dr Mark Toogood, was responsible for understanding public engagement with OPAL. The Open Air Laboratories (OPAL) network is a nationwide partnership comprising ten universities and five organisations, with grants awarded totalling £14.4 million.
    • Over half a million people have participated in the OPAL programme. OPAL activities are carried out by people of all ages, backgrounds and abilities, including 10,000 people in ‘hard to reach’ communities.
    • OPAL opens people’s eyes to the natural world. Nearly half (44%) of OPAL survey participants said that this was the first time that they had carried out a nature survey. 90% of participants have learnt something new.
    • OPAL has the ability to change people’s behaviour. Almost half (43%) of respondents said OPAL had changed the way they thought about the environment and more than a third (37%) said they will change their behaviour towards it.
    • In addition to raising environmental awareness, OPAL also improves personal well-being by motivating people to spend time outdoors doing something positive, while connecting with people and nature.

    Conditional Random Field Feature Generation of Smart Home Sensor Data using Random Forests

    A typical approach to building a feature set for a conditional random field model is to build a large set of conjunctions of atomic tests, all of which adhere to a small number of relatively simple templates. Building more complex features in this way can be difficult, as the more complex templates needed to do this can result in a combinatorial explosion in the number of features. We use the inherent instability of decision trees to produce a small set of more complex conjunctions that are particularly suitable for the problem to be solved, using the same techniques used in generating random forest ensemble classifiers, and build a CRF on these features. We apply this method to an activity recognition problem on a dataset from the CASAS smart home project, in which we predict activities of daily living from sensor activations.
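The core idea, turning root-to-leaf paths of a decision tree into conjunction features, can be sketched in a few lines. In the paper the trees are grown with random-forest techniques over smart-home sensor data; the hand-built tree and sample below are invented purely for illustration.

```python
# A tiny decision tree stored as nested tuples:
# internal node = (feature_index, threshold, left_child, right_child); leaf = None.
tree = (0, 0.5,
        (1, 0.3, None, None),
        (2, 0.7, None, None))

def conjunctions(node, path=()):
    """Enumerate root-to-leaf paths as conjunctions of atomic tests."""
    if node is None:
        return [path]
    feat, thr, left, right = node
    return (conjunctions(left, path + ((feat, thr, "<="),)) +
            conjunctions(right, path + ((feat, thr, ">"),)))

def features(x, conjs):
    """Binary features: 1 iff x satisfies every atomic test in the conjunction."""
    def holds(test):
        feat, thr, op = test
        return x[feat] <= thr if op == "<=" else x[feat] > thr
    return [int(all(holds(t) for t in conj)) for conj in conjs]

conjs = conjunctions(tree)     # 4 leaves -> 4 mutually exclusive conjunctions
x = [0.2, 0.9, 0.1]            # an invented sample of sensor-derived values
vec = features(x, conjs)       # exactly one conjunction fires for any sample
```

Each feature is a conjunction of atomic threshold tests, so a small forest of unstable trees yields a compact set of complex conjunctions without the combinatorial explosion of template enumeration; these binary indicators can then be supplied to a CRF as observation features.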

    Workplace-based interventions to promote healthy lifestyles in the NHS workforce : a rapid scoping and evidence map

    Background: The health and well-being of staff working in the NHS is a significant issue for UK health care. We sought to identify research relevant to the promotion of healthy lifestyles among NHS staff on behalf of NHS England. Objectives: To map existing reviews on workplace-based interventions to promote health and well-being, and to assess the scope for further evidence synthesis work. Design: Rapid and responsive scoping search and evidence map. Participants: Adult employees in any occupational setting and in any role. Interventions: Any intervention aimed at promoting or maintaining physical or mental health and well-being. Early intervention initiatives and those addressing violence against staff, workplace bullying or harassment were also included. Main outcome measures: Any outcome related to the effectiveness, cost-effectiveness or implementation of interventions. Data sources: A scoping search of nine databases was conducted to identify systematic reviews on health and well-being at work. Searches were limited by publication date (2000 to January/February 2019). Review methods: The titles and abstracts of over 8241 records were screened and a total of 408 potentially relevant publications were identified. Information on key characteristics was extracted from the titles and abstracts of all potentially relevant publications. Descriptive statistics (counts and percentages) for key characteristics were generated and data from reviews and ‘reviews of reviews’ were used to produce the evidence map. Results: Evidence related to a broad range of physical and mental health issues was identified across 12 ‘reviews of reviews’ and 312 other reviews, including 16 Cochrane reviews. There also exists National Institute for Health and Care Excellence guidance addressing multiple issues of potential relevance. A large number of reviews focused on mental health, changing lifestyle behaviour, such as physical activity, or on general workplace health/health promotion. 
Most of the reviews that focused only on health-care staff addressed mental health issues, and stress/burnout in particular. Limitations: The scoping search process was extensive and clearly effective at identifying relevant publications, but the strategy used may not have identified every potentially relevant review. Owing to the large number of potentially relevant reviews identified from the scoping search, it was necessary to produce the evidence map using information from the titles and abstracts of reviews only. Conclusions: It is doubtful that further evidence synthesis work at this stage would generate substantial new knowledge, particularly within the context of the NHS Health and Wellbeing Framework published in 2018. Additional synthesis work may be useful if it addressed an identifiable need and it was possible to identify one of the following: (1) a specific and focused research question arising from the current evidence map; it may then be appropriate to focus on a smaller number of reviews only, and provide a more thorough and critical assessment of the available evidence; or (2) a specific gap in the literature (i.e. an issue not already addressed by existing reviews or guidance); it may then be possible to undertake further literature searching and conduct a new evidence review.